Emotional Facial Expression Classification for Multimodal User Interfaces

Authors

  • Eva Cerezo
  • Isabelle Hupont
Abstract

We present a simple and computationally feasible method for the automatic emotional classification of facial expressions. We propose the use of 10 characteristic points (a subset of the MPEG-4 facial feature points) to extract relevant emotional information: essentially five distances, the presence of wrinkles, and the mouth shape. The method defines and detects the six basic emotions (plus the neutral one) in terms of this information and has been fine-tuned with a database of 399 images. For the moment, the method is applied to static images; application to image sequences is now under development. The extraction of such information about the user is of great interest for the development of new multimodal user interfaces.
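To make the feature-based idea concrete, the following is a minimal Python sketch of distance-based expression classification in the spirit described above. The feature-point names, the particular five distances, and the thresholds are illustrative assumptions for this sketch, not the authors' actual MPEG-4 point choices or their rules tuned on the 399-image database.

```python
# Hypothetical sketch: classify an expression from a handful of 2-D facial
# feature points using a few normalized distances. Point names, distances
# and thresholds are illustrative assumptions, not the paper's definitions.
import math
from typing import Dict, Tuple

Point = Tuple[float, float]


def dist(a: Point, b: Point) -> float:
    """Euclidean distance between two 2-D feature points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])


def extract_distances(p: Dict[str, Point]) -> Dict[str, float]:
    """Five emotion-related distances, normalized by the inter-ocular
    distance so the features are invariant to face scale."""
    iod = dist(p["left_eye_centre"], p["right_eye_centre"])
    return {
        "eyebrow_raise": dist(p["left_brow_inner"], p["left_eye_top"]) / iod,
        "eye_opening":   dist(p["left_eye_top"], p["left_eye_bottom"]) / iod,
        "mouth_opening": dist(p["mouth_top"], p["mouth_bottom"]) / iod,
        "mouth_width":   dist(p["mouth_left"], p["mouth_right"]) / iod,
        "brow_to_mouth": dist(p["left_brow_inner"], p["mouth_top"]) / iod,
    }


def classify(features: Dict[str, float]) -> str:
    """Toy rule-based mapping from distances to an emotion label; the real
    method tunes its rules against a labelled image database."""
    if features["mouth_width"] > 0.9 and features["mouth_opening"] < 0.3:
        return "joy"
    if features["eyebrow_raise"] > 0.5 and features["mouth_opening"] > 0.5:
        return "surprise"
    if features["eyebrow_raise"] < 0.2 and features["mouth_width"] < 0.6:
        return "anger"
    return "neutral"


if __name__ == "__main__":
    # Example usage with made-up coordinates (pixels).
    points = {
        "left_eye_centre": (30.0, 40.0), "right_eye_centre": (70.0, 40.0),
        "left_brow_inner": (35.0, 30.0), "left_eye_top": (30.0, 36.0),
        "left_eye_bottom": (30.0, 44.0), "mouth_top": (50.0, 70.0),
        "mouth_bottom": (50.0, 80.0), "mouth_left": (38.0, 75.0),
        "mouth_right": (62.0, 75.0),
    }
    print(classify(extract_distances(points)))
```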


Similar Articles

Individualizing the New Interfaces: Extraction of User's Emotions from Facial Data

When developing new multimodal user interfaces, emotional user information may be of great interest. In this paper we present a simple and computationally feasible method to perform automatic emotional classification of facial expressions. We propose the use of 10 characteristic points (a subset of the MPEG-4 facial feature points) to extract relevant emotional information (essentially five distance...


MAUI avatars: Mirroring the user's sensed emotions via expressive multi-ethnic facial avatars

In this paper we describe the Multimodal Affective User Interface (MAUI) we created to capture its users' emotional physiological signals via wearable computers and visualize the categorized signals in terms of recognized emotions. MAUI aims at 1) giving feedback to the users about their emotional states via various modalities (e.g. mirroring the user's facial expressions and describ...


The Facial Expression Module

In current dialogue systems the use of speech as an input modality is common, but this modality is only one of the many that human beings use. In human–human interaction, people also use gestures to point and facial expressions to show their moods. To give modern systems a chance to read information from all the modalities humans use, these systems must have multimodal user interfaces. The SMARTKOM sy...


Machine Analysis of Facial Behaviour: Naturalistic & Dynamic Behaviour

A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living and projecting the human user into the foreground. To realize this goal, next-generation computing (a.k.a. pervasive computing, ambient intelligence, and human computing) will need to develop human-centred user interfaces that respond readily to naturally occurring, ...


Identity Emulation (IE): Bio-inspired Facial Expression Interfaces for Emotive Robots

Our facial expression robots use biomimetic structures, aesthetic design principles, and recent breakthroughs in elastomer material sciences to enact a sizable range of natural humanlike facial expressions. Applications of this class of human-robot interface devices will rise in relevance as humans and robots begin to have more sociable encounters in the coming years. The Identity Emulation pro...



Journal:

Volume   Issue

Pages   -

Publication date: 2006